Variational Methods in Convex Analysis

Authors

  • Jonathan M. Borwein
  • Qiji J. Zhu
Abstract

We use variational methods to provide a concise development of a number of basic results in convex and functional analysis. This illuminates the parallels between convex analysis and smooth subdifferential theory.

1. The purpose of this note is to give a concise and explicit account of the following folklore: several fundamental theorems in convex analysis, such as the sandwich theorem and the Fenchel duality theorem, may usefully be proven by variational arguments. Many important results in linear functional analysis can then be easily deduced as special cases. These are entirely parallel to the basic calculus of smooth subdifferential theory. Some of these relationships have already been discussed in [1, 2, 5, 6, 12, 18].

2. By a ‘variational argument’ we connote a proof with two main components: (a) an argument that an appropriate auxiliary function attains its minimum, and (b) a ‘decoupling’ mechanism in a sense we make precise below. It is well known that this methodology lies behind many basic results of smooth subdifferential theory [6, 20]. It is known, but not always made explicit, that this is equally so in convex analysis. Here we record in an organized fashion that this method also lies behind most of the important theorems in convex analysis. In convex analysis the role of (a) is usually played by the following theorem, attributed to Fenchel and Rockafellar (among others), for which some preliminaries are needed.

Let X be a real locally convex topological vector space. Recall that the domain of an extended-valued convex function f on X (denoted dom f) is the set of points with value less than +∞. A subset T of X is absorbing if X = ⋃_{λ>0} λT, and a point s is in the core of a set S ⊂ X (denoted by s ∈ core S) provided that S − s is absorbing and s ∈ S. A symmetric, convex, closed and absorbing subset of X is called a barrel. We say X is barrelled if every barrel of X is a neighborhood of zero. All Baire locally convex spaces, and hence all complete metrizable ones, are barrelled, but not conversely. Recall that x∗ ∈ X∗, the topological dual, is a subgradient of f : X → (−∞, +∞] at x ∈ dom f provided that f(y) − f(x) ≥ 〈x∗, y − x〉 for all y ∈ X. The set of all subgradients of f at x is called the subdifferential of f at x and is denoted ∂f(x). We use the standard convention that ∂f(x) = ∅ for x ∉ dom f. We use cont f to denote the set of all continuity points of f.

Date: November 22, 2006.
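As a concrete illustration of the subgradient definition above (an added worked example, not taken from the paper), consider the simplest nonsmooth convex function, f(x) = |x| on X = ℝ with the usual pairing 〈x∗, y〉 = x∗y. The short LaTeX sketch below computes its subdifferential directly from the inequality f(y) − f(x) ≥ 〈x∗, y − x〉.

    % Worked example (assumption: X = R, f(x) = |x|).
    % x* is a subgradient of f at 0 iff |y| - |0| >= x*(y - 0) for all y,
    % i.e. |y| >= x* y for every real y, which holds exactly when |x*| <= 1.
    \[
      \partial f(0) = [-1, 1],
      \qquad
      \partial f(x) = \{\operatorname{sign}(x)\} \quad (x \neq 0),
    \]
    % so away from the origin the subdifferential reduces to the ordinary derivative.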


Similar articles

The Solvability of Concave-Convex Quasilinear Elliptic Systems Involving $p$-Laplacian and Critical Sobolev Exponent

In this work, we study the existence of non-trivial multiple solutions for a class of quasilinear elliptic systems with concave-convex nonlinearities and critical growth terms in bounded domains. Using variational methods, in particular the Nehari manifold and the Palais-Smale condition, we prove existence and multiplicity results for positive solutions.


$(\varphi_1, \varphi_2)$-variational principle

In this paper we prove that if $X$ is a Banach space, then for every lower semi-continuous, bounded below function $f$ there exists a $(\varphi_1, \varphi_2)$-convex function $g$, with arbitrarily small norm, such that $f + g$ attains its strong minimum on $X$. This result extends some of the well-known variational principles such as that of Ekeland [On the variational principle, J. Ma...


Strong convergence theorem for a class of multiple-sets split variational inequality problems in Hilbert spaces

In this paper, we introduce a new iterative algorithm for approximating a common solution of a certain class of multiple-sets split variational inequality problems. The sequence generated by the proposed algorithm is proved to converge strongly in Hilbert spaces. As an application, we obtain some strong convergence results for some classes of multiple-sets split convex minimization problems.


Sequential Optimality Conditions and Variational Inequalities

In recent years, sequential optimality conditions have frequently been used to establish convergence of iterative methods for nonlinear constrained optimization problems. Sequential optimality conditions do not require any of the constraint qualifications. In this paper, we present the necessary sequential complementary approximate Karush-Kuhn-Tucker (CAKKT) condition for a point to be a solution of a ...


Extensions of Saeidi's Propositions for Finding a Unique Solution of a Variational Inequality for $(u,v)$-cocoercive Mappings in Banach Spaces

Let $C$ be a nonempty closed convex subset of a real Banach space $E$, let $B: C \rightarrow E$ be a nonlinear map, and let $u, v$ be positive numbers. In this paper, we show that the generalized variational inequality $VI(C, B)$ is a singleton for $(u, v)$-cocoercive mappings under appropriate assumptions on the Banach space. The main results are extensions of Saeidi's propositions for fi...


Variational inequalities on Hilbert $C^*$-modules

We introduce variational inequality problems on Hilbert $C^*$-modules and prove several existence results for variational inequalities defined on closed convex sets. The relation between variational inequalities, the $C^*$-valued metric projection and fixed point theory on Hilbert $C^*$-modules is then studied.



Journal:
  • J. Global Optimization

Volume: 35, Issue: -

Pages: -

Published: 2006